Hierarchical Mixtures of Experts and the EM Algorithm
Authors: Michael I. Jordan and Robert A. Jacobs
Abstract
We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIM’s). Learning is treated as a maximum likelihood problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain.
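The EM iteration described in the abstract can be sketched for the simplest, single-level case: linear-Gaussian experts combined by a softmax gating network. This is a simplified illustration, not the paper's full procedure; the paper fits both experts and gates as GLIMs via inner IRLS loops and handles a full hierarchy, whereas here the gating M-step is approximated by a single gradient step and all names (`em_mixture_of_experts`, `sigma2`, `lr`) are illustrative.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def em_mixture_of_experts(X, y, n_experts=2, n_iters=50, sigma2=0.1, lr=0.5):
    """EM for a single-level mixture of experts (illustrative sketch).

    Experts: linear-Gaussian regressors with fixed variance sigma2.
    Gate: softmax over linear scores.
    E-step: posterior responsibilities h.
    M-step: weighted least squares per expert; one gradient ascent
    step on the gating log-likelihood (the paper instead uses IRLS).
    """
    n, d = X.shape
    rng = np.random.default_rng(0)
    V = rng.normal(scale=0.1, size=(d, n_experts))  # gating weights
    W = rng.normal(scale=0.1, size=(d, n_experts))  # expert weights
    for _ in range(n_iters):
        g = softmax(X @ V)                      # prior gate probabilities, (n, k)
        mu = X @ W                              # expert predictions, (n, k)
        lik = np.exp(-(y[:, None] - mu) ** 2 / (2 * sigma2))
        h = g * lik
        h = h / h.sum(axis=1, keepdims=True)    # E-step: responsibilities
        # M-step for experts: responsibility-weighted least squares
        for j in range(n_experts):
            hw = h[:, j]
            A = X.T @ (hw[:, None] * X) + 1e-6 * np.eye(d)  # small ridge for stability
            W[:, j] = np.linalg.solve(A, X.T @ (hw * y))
        # M-step for gate: gradient of sum_n sum_j h_nj * log g_nj w.r.t. V
        g = softmax(X @ V)
        V += lr * X.T @ (h - g) / n
    return W, V
```

On a piecewise-linear target such as y = |x|, the two experts specialize to the two linear regimes while the gate learns to partition the input space, which is the soft divide-and-conquer behavior the architecture is designed to exploit.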
Related Papers
Convergence results for the EM approach to mixtures of experts architectures
The Expectation-Maximization (EM) algorithm is an iterative approach to maximum likelihood parameter estimation. Jordan and Jacobs (1994) recently proposed an EM algorithm for the mixture of experts architecture of Jacobs, Jordan, Nowlan and Hinton (1991) and the hierarchical mixture of experts architecture of Jordan and Jacobs (1992). They showed empirically that the EM algorithm for these arc...
Hierarchical mixtures of experts methodology applied to continuous speech recognition
In this paper, we incorporate the Hierarchical Mixtures of Experts (HME) method of probability estimation, developed by Jordan [1], into an HMM-based continuous speech recognition system. The resulting system can be thought of as a continuous-density HMM system, but instead of using Gaussian mixtures, the HME system employs a large set of hierarchically organized but relatively small neural netw...
Adaptively Growing Hierarchical Mixtures of Experts
We propose a novel approach to automatically growing and pruning Hierarchical Mixtures of Experts. The constructive algorithm proposed here enables large hierarchies consisting of several hundred experts to be trained effectively. We show that HMEs trained by our automatic growing procedure yield better generalization performance than traditional static and balanced hierarchies. Evaluation of...
Time Series Modeling via Hierarchical Mixtures
We address the problem of model comparison and model mixing in time series using the approach known as Hierarchical Mixtures-of-Experts. Our methodology allows for comparisons of arbitrary models, not restricted to a particular class or parametric form. Additionally, the approach is flexible enough to incorporate exogenous information that can be summarized in terms of covariables or simply tim...
A hierarchical mixture model for software reliability prediction
It is important to develop general prediction models in current software reliability research. In this paper, we propose a hierarchical mixture of software reliability models (HMSRM) for software reliability prediction. This is an application of the hierarchical mixtures of experts (HME) architecture. In HMSRM, individual software reliability models are used as experts. During the training of H...
Journal: Neural Computation
Volume: 6
Pages: -
Published: 1994